Newton-Type Methods for Optimization Problems without Constraint Qualifications


Similar articles

Newton-Type Methods for Optimization Problems without Constraint Qualifications

We consider equality-constrained optimization problems, where a given solution may not satisfy any constraint qualification but satisfies the standard second-order sufficient condition for optimality. Based on local identification of the rank of the constraint degeneracy via the singular-value decomposition, we derive a modified primal-dual optimality system whose solution is locally unique, n...
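As a rough illustration of the idea sketched in this abstract, the NumPy fragment below detects the numerical rank of the constraint Jacobian with an SVD and then takes a Newton step on a reduced primal-dual system that is square and, under the second-order sufficient condition, locally nonsingular. The helper names (grad_f, hess_L, h, jac_h) and this particular rank reduction are placeholder assumptions, not the construction from the paper.

```python
import numpy as np

def newton_step_rank_reduced(x, lam, grad_f, hess_L, h, jac_h, tol=1e-8):
    """One Newton-type step for min f(x) s.t. h(x) = 0 when the Jacobian of h
    may be rank-deficient at the solution (no constraint qualification).

    Hypothetical sketch: the numerical rank r of h'(x) is estimated with an
    SVD, the constraints are replaced locally by their r dominant left
    singular directions, and a Newton step is taken on the resulting reduced
    primal-dual system.
    """
    J = jac_h(x)                                   # m-by-n constraint Jacobian
    U, s, _ = np.linalg.svd(J, full_matrices=False)
    r = int(np.sum(s > tol * max(s[0], 1.0)))      # numerical rank estimate
    Ur = U[:, :r]                                  # dominant left singular vectors

    Jr = Ur.T @ J                                  # reduced, full-row-rank Jacobian
    hr = Ur.T @ h(x)                               # reduced constraint residual
    mu = Ur.T @ lam                                # reduced multiplier estimate

    n = len(x)
    K = np.block([[hess_L(x, lam), Jr.T],
                  [Jr, np.zeros((r, r))]])         # reduced KKT matrix
    rhs = -np.concatenate([grad_f(x) + Jr.T @ mu, hr])
    step = np.linalg.solve(K, rhs)
    return x + step[:n], Ur @ (mu + step[n:])      # new point and lifted multiplier
```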


Stabilized Sequential Quadratic Programming for Optimization and a Stabilized Newton-type Method for Variational Problems without Constraint Qualifications

The stabilized version of the sequential quadratic programming algorithm (sSQP) was developed to achieve fast convergence despite possible degeneracy of the constraints of optimization problems, when the Lagrange multipliers associated with a solution are not unique. Superlinear convergence of sSQP had previously been established under the second-order sufficient condition for optimality...
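For orientation, one common way to present the stabilized step for equality constraints is the primal-dual system below, in which the dual block of the KKT matrix is regularized by the current optimality residual. This is a generic sketch of the sSQP idea with placeholder function names, not the specific algorithm analyzed in the paper.

```python
import numpy as np

def stabilized_newton_step(x, lam, grad_f, hess_L, h, jac_h):
    """One stabilized (sSQP-style) step for min f(x) s.t. h(x) = 0.

    Illustrative sketch: the zero (2,2) block of the usual KKT system is
    replaced by -sigma * I, with sigma equal to the current optimality
    residual; this keeps the linear system nonsingular even when the
    constraint Jacobian is rank-deficient and the multipliers are non-unique.
    """
    J = jac_h(x)
    grad_L = grad_f(x) + J.T @ lam                          # gradient of the Lagrangian
    res = h(x)
    sigma = np.linalg.norm(np.concatenate([grad_L, res]))   # stabilization parameter

    n, m = len(x), len(res)
    K = np.block([[hess_L(x, lam), J.T],
                  [J, -sigma * np.eye(m)]])
    step = np.linalg.solve(K, -np.concatenate([grad_L, res]))
    return x + step[:n], lam + step[n:]
```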


Optimal Newton-type methods for nonconvex smooth optimization problems

We consider a general class of second-order iterations for unconstrained optimization that includes regularization and trust-region variants of Newton’s method. For each method in this class, we exhibit a smooth, bounded-below objective function whose gradient is globally Lipschitz continuous within an open convex set containing any iterates encountered and whose Hessian is α-Hölder continuous...
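One representative member of the class mentioned in this abstract is a regularized Newton iteration that shifts the Hessian and adapts the shift like a trust-region radius. The sketch below is illustrative only; the constants and the acceptance test are my assumptions, not the paper's.

```python
import numpy as np

def regularized_newton(f, grad, hess, x, sigma=1.0, tol=1e-8, max_iter=100):
    """Minimal sketch of a regularized Newton method for smooth nonconvex
    minimization: solve (H + sigma*I) d = -g and adapt sigma based on the
    decrease actually achieved."""
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= tol:
            break
        H = hess(x)
        try:
            d = np.linalg.solve(H + sigma * np.eye(len(x)), -g)
        except np.linalg.LinAlgError:
            sigma *= 10.0                    # singular shift: regularize harder
            continue
        if f(x + d) < f(x) - 1e-4 * sigma * np.dot(d, d):
            x = x + d                        # accept the step
            sigma = max(sigma / 2.0, 1e-8)   # and relax the regularization
        else:
            sigma *= 10.0                    # reject and regularize harder
    return x
```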


On constraint qualifications in directionally differentiable multiobjective optimization problems

We consider a multiobjective optimization problem with a feasible set defined by inequality and equality constraints, where all functions are at least Dini differentiable (in some cases Hadamard differentiable and sometimes quasiconvex). Several constraint qualifications are given that generalize both the qualifications introduced by Maeda and the classical ones, when the f...
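For reference, the "classical" qualifications that such results generalize are, in the smooth single-objective case, usually LICQ and MFCQ. They are recalled below as standard background, not as material taken from the paper.

```latex
% Feasible set: g_i(x) \le 0 \ (i \in I), \ h_j(x) = 0 \ (j \in J); \ A(\bar x) = active inequality indices.
\text{LICQ at } \bar x:\quad \{\nabla g_i(\bar x) : i \in A(\bar x)\} \cup \{\nabla h_j(\bar x) : j \in J\} \ \text{is linearly independent.}

\text{MFCQ at } \bar x:\quad \{\nabla h_j(\bar x)\}_{j \in J} \ \text{is linearly independent and } \exists\, d:\
\nabla h_j(\bar x)^{\top} d = 0 \ \forall j \in J, \quad \nabla g_i(\bar x)^{\top} d < 0 \ \forall i \in A(\bar x).
```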


Proximal Newton-type methods for convex optimization

We seek to solve convex optimization problems in composite form: minimize over x ∈ R^n the objective f(x) := g(x) + h(x), where g is convex and continuously differentiable and h : R^n → R is a convex but not necessarily differentiable function whose proximal mapping can be evaluated efficiently. We derive a generalization of Newton-type methods to handle such convex but nonsmooth objective functions. We prove such met...
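In this composite setting a natural step minimizes a quadratic model of g plus the untouched nonsmooth term h. The NumPy sketch below illustrates this for the special case h = reg * ||.||_1, approximately solving the subproblem with proximal gradient on the model; the function names, the inner solver, and the unit step are illustrative choices, not the paper's specific algorithm (which also uses a line search).

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal mapping of t * ||.||_1 (soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def prox_newton_step(x, grad_g, hess_g, reg, inner_iters=50):
    """One proximal Newton step for minimizing g(x) + reg * ||x||_1.

    Approximately solves
        min_z  grad_g(x)^T (z - x) + 0.5 (z - x)^T H (z - x) + reg * ||z||_1
    by running proximal gradient on this quadratic model.
    """
    g = grad_g(x)
    H = hess_g(x)
    L = max(np.linalg.norm(H, 2), 1e-12)    # Lipschitz constant of the model gradient
    z = x.copy()
    for _ in range(inner_iters):
        model_grad = g + H @ (z - x)        # gradient of the quadratic model at z
        z = soft_threshold(z - model_grad / L, reg / L)
    return z                                # new iterate (unit step)
```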



Journal

Journal title: SIAM Journal on Optimization

Year: 2004

ISSN: 1052-6234, 1095-7189

DOI: 10.1137/s1052623403427264